# Next Sentence Prediction
## Debertav2 Base Uncased

- Maintainer: mlcorelib
- License: Apache-2.0
- Category: Large Language Model (English)
- Downloads: 21 · Likes: 0

BERT is a pre-trained language model based on the Transformer architecture, trained on an English corpus through masked language modeling and next sentence prediction tasks.
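As a sketch of the task this page indexes, the snippet below runs next sentence prediction with Hugging Face Transformers. The `bert-base-uncased` checkpoint is an illustrative assumption; the models listed here are not verified to expose an NSP head.

```python
# Sketch: next sentence prediction with a BERT-style checkpoint.
# "bert-base-uncased" is an illustrative choice, not one of the listed models.
import torch
from transformers import BertForNextSentencePrediction, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

sentence_a = "The cat sat on the mat."
sentence_b = "It fell asleep in the sun."

# Encode the sentence pair; the NSP head scores whether B follows A.
inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): [is_next, is_random]

probs = torch.softmax(logits, dim=-1)
print(f"P(sentence_b follows sentence_a) = {probs[0, 0]:.3f}")
```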
## Rubert Base Cased Conversational

- Maintainer: DeepPavlov
- Category: Large Language Model (Other)
- Downloads: 165.49k · Likes: 20

Russian conversational model trained on OpenSubtitles, Dirty, Pikabu, and the social-media sections of the Taiga corpus.
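A minimal sketch of loading this checkpoint as a feature extractor with Hugging Face Transformers; the model id `DeepPavlov/rubert-base-cased-conversational` is assumed from the listing name.

```python
# Sketch: encoding informal Russian text with conversational RuBERT.
# The model id below is assumed from the listing; adjust if it differs.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "DeepPavlov/rubert-base-cased-conversational"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

text = "Привет! Как дела?"  # informal register, matching the training corpora
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```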